# Multi-industry Adaptation
## Roberta Base Zeroshot V2.0 C
- Author: MoritzLaurer
- License: MIT
- Task: Text Classification · Library: Transformers · Language: English
- Downloads: 3,188 · Likes: 4

A zero-shot classification model based on the RoBERTa architecture. It performs text classification without requiring task-specific training data, runs on both GPU and CPU, and was trained exclusively on fully commercially friendly data (see the usage sketch below).
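A minimal usage sketch with the Transformers zero-shot classification pipeline, assuming the Hugging Face Hub id for this entry is `MoritzLaurer/roberta-base-zeroshot-v2.0-c`. It classifies a sentence against candidate labels chosen at inference time, without any fine-tuning.

```python
# Minimal sketch: zero-shot classification via the Transformers pipeline.
# Assumption: the Hub id "MoritzLaurer/roberta-base-zeroshot-v2.0-c"
# corresponds to this listing entry.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="MoritzLaurer/roberta-base-zeroshot-v2.0-c",
)

text = "The quarterly report shows revenue growth across all regions."
candidate_labels = ["finance", "sports", "politics", "technology"]

result = classifier(text, candidate_labels)
print(result["labels"][0], result["scores"][0])  # top label and its score
```

Because the labels are supplied at call time, the same loaded model can be reused across different classification schemes without retraining.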
## Deberta V3 Large Zeroshot V2.0 C
- Author: MoritzLaurer
- License: MIT
- Task: Text Classification · Library: Transformers · Language: English
- Downloads: 1,560 · Likes: 20

A DeBERTa-v3-large model designed for efficient zero-shot classification. It is trained on fully commercially friendly synthetic data and NLI datasets, and supports both GPU and CPU inference (device-selection sketch below).
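Since this entry highlights GPU/CPU inference, the sketch below shows explicit device selection and independent multi-label scoring. The Hub id `MoritzLaurer/deberta-v3-large-zeroshot-v2.0-c` is assumed to match this listing.

```python
# Minimal sketch: device selection (GPU if available, else CPU) and
# multi-label scoring with the assumed Hub id
# "MoritzLaurer/deberta-v3-large-zeroshot-v2.0-c".
import torch
from transformers import pipeline

device = 0 if torch.cuda.is_available() else -1  # 0 = first GPU, -1 = CPU
classifier = pipeline(
    "zero-shot-classification",
    model="MoritzLaurer/deberta-v3-large-zeroshot-v2.0-c",
    device=device,
)

result = classifier(
    "Battery life is great but the screen scratches easily.",
    candidate_labels=["battery", "display", "price", "customer service"],
    multi_label=True,  # score each label independently instead of softmaxing
)
print(dict(zip(result["labels"], result["scores"])))
```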